Baichuan-7B is an open-source large-scale pre-trained language model developed by Baichuan Intelligence. Based on the Transformer architecture with 7 billion parameters, it was trained on a bilingual Chinese-English corpus and supports a context window of 4,096 tokens.
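The 4,096-token context window means a prompt plus any requested generation must fit within that budget. The sketch below illustrates this bookkeeping in plain Python; the constant comes from the description above, while the function names and the use of raw token-id lists are illustrative assumptions (in practice you would count tokens with the model's own tokenizer).

```python
MAX_CONTEXT = 4096  # Baichuan-7B's context window, per the model description


def fits_in_context(token_ids, max_new_tokens=0, max_context=MAX_CONTEXT):
    """Return True if the prompt plus the requested generation fits the window."""
    return len(token_ids) + max_new_tokens <= max_context


def truncate_left(token_ids, max_new_tokens=0, max_context=MAX_CONTEXT):
    """Drop the oldest tokens so prompt + generation stays within the window."""
    budget = max_context - max_new_tokens
    return token_ids[-budget:] if len(token_ids) > budget else token_ids


# Example: a 4,000-token prompt leaves room for at most 96 new tokens.
prompt = list(range(4000))
assert fits_in_context(prompt, max_new_tokens=96)
assert not fits_in_context(prompt, max_new_tokens=97)
```

Left-truncation (keeping the most recent tokens) is a common default for chat-style prompts, since the latest turns usually matter most.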